Fisher Linear Discriminant Analysis
Authors
Abstract
Fisher Linear Discriminant Analysis (also called Linear Discriminant Analysis, LDA) is a method used in statistics, pattern recognition, and machine learning to find a linear combination of features that characterizes or separates two or more classes of objects or events. The resulting combination may be used as a linear classifier or, more commonly, for dimensionality reduction before later classification. LDA is closely related to PCA, since both are based on linear transformations, i.e. matrix multiplications. In PCA, the transformation is based on minimizing the mean square error between the original data vectors and the data vectors that can be estimated from the reduced-dimensionality data vectors; PCA does not take class membership into account. In LDA, by contrast, the transformation is based on maximizing the ratio of between-class variance to within-class variance, with the goal of reducing data variation within the same class and increasing the separation between classes. Figure 1 shows an example of LDA:
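The ratio-maximizing direction described above has a closed form for two classes: w ∝ S_W⁻¹(m₁ − m₂), where S_W is the within-class scatter matrix and m₁, m₂ are the class means. A minimal sketch in NumPy, using synthetic toy data (the class locations, sample sizes, and variable names are illustrative assumptions, not taken from the abstract):

```python
# Two-class Fisher LDA sketch: find the projection direction that
# maximizes between-class variance relative to within-class variance.
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy data: two Gaussian classes in 2-D (assumed, not from the text).
X1 = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(50, 2))
X2 = rng.normal(loc=[2.0, 2.0], scale=0.5, size=(50, 2))

m1, m2 = X1.mean(axis=0), X2.mean(axis=0)

# Within-class scatter: sum of the two class scatter matrices.
S_W = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)

# Fisher direction w ∝ S_W^{-1} (m1 - m2); normalize for convenience.
w = np.linalg.solve(S_W, m1 - m2)
w /= np.linalg.norm(w)

# Projecting onto w reduces the data to one dimension while
# keeping the two classes well separated along that axis.
p1, p2 = X1 @ w, X2 @ w
print("separation:", abs(p1.mean() - p2.mean()) / max(p1.std(), p2.std()))
```

Thresholding the 1-D projections then yields a linear classifier, which is the "resulting combination used as a linear classifier" mentioned above.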
Similar resources
Robust Fisher Discriminant Analysis
Fisher linear discriminant analysis (LDA) can be sensitive to the problem data. Robust Fisher LDA can systematically alleviate this sensitivity by explicitly incorporating a model of data uncertainty into the classification problem and optimizing for the worst-case scenario under this model. The main contribution of this paper is to show that with general convex uncertainty models on the proble...
A Model Classification Technique for Linear Discriminant Analysis for Two Groups
Linear discriminant analysis, introduced by Fisher, is a well-known dimension reduction and classification approach that has received much attention in the statistical literature. Most researchers have focused on its deficiencies; as such, different versions of classification procedures have been introduced for various applications. In this paper, we attempt not to robustify the Fisher linear...
Error bounds for Kernel Fisher Linear Discriminant in Gaussian Hilbert space
We give a non-trivial, non-asymptotic upper bound on the classification error of the popular Kernel Fisher Linear Discriminant classifier under the assumption that the kernel-induced space is a Gaussian Hilbert space.
Informative Discriminant Analysis
We introduce a probabilistic model that generalizes classical linear discriminant analysis and gives an interpretation for the components as informative or relevant components of data. The components maximize the predictability of class distribution which is asymptotically equivalent to (i) maximizing mutual information with the classes, and (ii) finding principal components in the so-called le...
Rapid and Brief Communication: Alternative linear discriminant classifier
Fisher linear discriminant analysis (FLDA) finds a set of optimal discriminating vectors by maximizing the Fisher criterion, i.e., the ratio of the between-class scatter to the within-class scatter. One of its major disadvantages is that the number of discriminating vectors it can find is bounded above by C−1 for a C-class problem. In this paper, for the binary-class problem, we propose alternative FL...
Multiclass Linear Dimension Reduction by Weighted Pairwise Fisher Criteria
We derive a class of computationally inexpensive linear dimension reduction criteria by introducing a weighted variant of the well-known K-class Fisher criterion associated with linear discriminant analysis (LDA). It can be seen that LDA weights contributions of individual class pairs according to the Euclidean distance of the respective class means. We generalize upon LDA by introducing a dif...